PCA-Kernel Estimation

Authors

  • Gérard Biau
  • André Mas
Abstract

Many statistical estimation techniques for high-dimensional or functional data are based on a preliminary dimension reduction step, which consists of projecting the sample X1, …, Xn onto the first D eigenvectors of the Principal Component Analysis (PCA) associated with the empirical projector Π̂D. Classical nonparametric inference methods such as kernel density estimation or kernel regression analysis are then performed in the (usually small) D-dimensional space. However, the mathematical analysis of this data-driven dimension reduction scheme raises technical problems, because the random variables of the projected sample (Π̂DX1, …, Π̂DXn) are no longer independent. As a reference for further studies, we offer in this paper several results showing the asymptotic equivalences between important kernel-related quantities based on the empirical projector and those based on its theoretical counterpart. As an illustration, we provide an in-depth analysis of the nonparametric kernel regression case.
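To make the two-step scheme concrete, here is a minimal, self-contained sketch (not taken from the paper) of what the abstract describes: the empirical projector is estimated from the sample covariance, the data are projected onto its first D eigenvectors, and a Nadaraya-Watson kernel regression is then run in the reduced D-dimensional space. The function names, the Gaussian kernel, and the bandwidth h are illustrative assumptions, not notation from the article.

```python
import numpy as np

def empirical_projector(X, D):
    """Eigenvectors of the sample covariance spanning the empirical PCA subspace.

    X : (n, p) array of observations (e.g. a discretized functional sample).
    Returns a (p, D) matrix V; V @ V.T is a finite-dimensional stand-in for
    the empirical projector onto the first D principal directions.
    """
    Xc = X - X.mean(axis=0)
    cov = Xc.T @ Xc / X.shape[0]
    eigvals, eigvecs = np.linalg.eigh(cov)   # eigenvalues in ascending order
    order = np.argsort(eigvals)[::-1][:D]    # indices of the D largest eigenvalues
    return eigvecs[:, order]

def nadaraya_watson(Z_train, Y_train, Z_query, h):
    """Gaussian-kernel (Nadaraya-Watson) regression in the reduced space."""
    d2 = ((Z_query[:, None, :] - Z_train[None, :, :]) ** 2).sum(axis=2)
    W = np.exp(-d2 / (2.0 * h ** 2))
    return (W @ Y_train) / np.clip(W.sum(axis=1), 1e-12, None)

# Illustrative use: project onto D = 3 empirical principal directions,
# then regress Y on the projected covariates with bandwidth h = 0.5.
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 50))               # high-dimensional inputs
Y = np.sin(X[:, 0]) + 0.1 * rng.normal(size=200)
V = empirical_projector(X, D=3)
Z = (X - X.mean(axis=0)) @ V                 # projected sample
Y_hat = nadaraya_watson(Z, Y, Z, h=0.5)
```

Replacing the empirical eigenvectors by those of the true covariance operator gives the theoretical counterpart of the same estimator; the paper's results quantify how close kernel quantities built from the two projectors are asymptotically.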


Similar articles

Kernel Principal Components Are Maximum Entropy Projections

Principal Component Analysis (PCA) is a very well-known statistical tool. Kernel PCA is a nonlinear extension of PCA based on the kernel paradigm. In this paper we characterize the projections found by Kernel PCA from an information-theoretic perspective. We prove that Kernel PCA provides optimum entropy projections in the input space when the Gaussian kernel is used for the mapping and a sample...
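As background for the construction mentioned above, the following is a generic kernel PCA sketch with a Gaussian (RBF) kernel; it shows only the standard centered-Gram-matrix projection step and does not reproduce the entropy analysis of the cited paper. The bandwidth sigma and all names are illustrative assumptions.

```python
import numpy as np

def kernel_pca(X, n_components, sigma=1.0):
    """Kernel PCA with the Gaussian kernel k(x, y) = exp(-||x - y||^2 / (2 sigma^2))."""
    n = X.shape[0]
    d2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(axis=2)
    K = np.exp(-d2 / (2.0 * sigma ** 2))
    J = np.eye(n) - np.ones((n, n)) / n      # centering matrix
    Kc = J @ K @ J                           # kernel matrix centered in feature space
    eigvals, eigvecs = np.linalg.eigh(Kc)
    order = np.argsort(eigvals)[::-1][:n_components]
    # Rescale eigenvectors so the extracted feature-space components have unit norm.
    alphas = eigvecs[:, order] / np.sqrt(np.clip(eigvals[order], 1e-12, None))
    return Kc @ alphas                       # projections of the training sample

# Example: extract two nonlinear components from a toy sample.
rng = np.random.default_rng(1)
X = rng.normal(size=(100, 5))
Z = kernel_pca(X, n_components=2, sigma=2.0)
```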


Spacecraft Pose Estimation using Principal Component Analysis and a Monocular Camera

The method of Principal Components Analysis (PCA) is widely used in statistical data analysis for engineering and the sciences. It is an effective tool for reducing the dimensionality of datasets while retaining the majority of the information in the data. This paper explores the use of PCA for spacecraft pose estimation for the purpose of proximity operations, and adapts a novel kernel-based PCA...


Kernel Methods for Unsupervised Learning

Kernel Methods are algorithms that project input data through a nonlinear mapping into a new space (the feature space). In this thesis we have investigated Kernel Methods for unsupervised learning, namely Kernel Methods that do not require target (labeled) data. Two classical unsupervised learning problems have been tackled with Kernel Methods: the former is data dimensionality estimation, the latter is the...


Eigenvoice Speaker Adaptation via Composite Kernel PCA

Eigenvoice speaker adaptation has been shown to be effective when only a small amount of adaptation data is available. At the heart of the method is principal component analysis (PCA), which is employed to find the most important eigenvoices. In this paper, we postulate that nonlinear PCA, in particular kernel PCA, may be even more effective. One major challenge is to map the feature-space eigenvoices ba...


Fault Detection Approach Based on Weighted Principal Component Analysis Applied to Continuous Stirred Tank Reactor

A fault detection approach based on principal component analysis (PCA) may not perform well when the process is time-varying, because time-varying behavior adversely affects feature extraction. To address this problem, a modified PCA that takes variance maximization into account is proposed, referred to as weighted PCA (WPCA). WPCA can extract the slow-feature information of the observed data in time-varying s...



Publication date: 2010